Information retrieval evaluation

Results: 457



#    Item
331. Relevance feedback / Web search query / Relevance / IR evaluation / Precision and recall / Query expansion / Discounted cumulative gain / Text Retrieval Conference / Information science / Information retrieval / Science

TREC 2013 Web Track Overview Kevyn Collins-Thompson University of Michigan Paul Bennett, Fernando Diaz Microsoft Research


Source URL: trec.nist.gov

Language: English - Date: 2014-02-28 12:39:27
332. Relevance feedback / Query expansion / Language model / Precision and recall / Tf*idf / Relevance / Data model / Search engine indexing / Information science / Information retrieval / Science

UMass at TREC 2013 Knowledge Base Acceleration Track: Bi-directional Entity Linking and Time-aware Evaluation Laura Dietz University of Massachusetts, Amherst [removed]


Source URL: trec.nist.gov

Language: English - Date: 2014-02-28 12:39:27
333. Query expansion / Relevance feedback / Tf*idf / Relevance / Twitter / Information science / Information retrieval / Okapi BM25

Tie-breaker: A New Perspective of Ranking and Evaluation for Microblog Retrieval Yue Wang, Jerry Darko, and Hui Fang Department of Electrical and Computer Engineering University of Delaware 140 Evans Hall, Newark, Delaware


Source URL: trec.nist.gov

Language: English - Date: 2014-02-28 12:39:27
334. Evaluation / Biostatistics / Information retrieval / Standards-based education / Design of experiments / Type I and type II errors / Accuracy and precision / Gain / Standardized test / Statistics / Knowledge / Education

The Tipping Point: Understanding the Tradeoffs Associated with Teacher Misclassification in High Stakes Personnel Decisions


Source URL: cepa.stanford.edu

Language: English - Date: 2013-04-18 11:39:31
335. Relevance / Crowdsourcing / Precision and recall / IR evaluation / National Institute of Standards and Technology / Text Retrieval Conference / Relevance feedback / Information science / Information retrieval / Science

Overview of the TREC 2013 Crowdsourcing Track Mark D. Smucker1 , Gabriella Kazai2 , and Matthew Lease3 1 Department of Management Sciences, University of Waterloo 2


Source URL: trec.nist.gov

Language: English - Date: 2014-02-28 12:39:25
336. Learning to rank / Query expansion / Search engine indexing / Search engine / Web search engine / Google Search / Bing / Precision and recall / IR evaluation / Information science / Information retrieval / Text Retrieval Conference

Overview of the TREC 2013 Federated Web Search Track Thomas Demeester1 , Dolf Trieschnigg2 , Dong Nguyen2 , Djoerd Hiemstra2 1 2


Source URL: trec.nist.gov

Language: English - Date: 2014-02-28 12:39:25
337. Natural language processing / Information retrieval / Linguistics / Relevance / Speech recognition / DARPA / Named-entity recognition / Text Retrieval Conference / Computational linguistics / Science / Information science

Automatic language and information processing: rethinking evaluation ∗ Karen Sparck Jones Computer Laboratory, University of Cambridge New Museums Site, Pembroke Street, Cambridge CB2 3QG, UK [removed]


Source URL: www.cl.cam.ac.uk

Language: English - Date: 2007-01-08 06:55:10
338. Concept Search / Text Retrieval Conference / Zubulake v. UBS Warburg / Discovery / John M. Facciola / Westlaw / Relevance feedback / Electronic discovery / Precision and recall / Information science / Information retrieval / Science

Artificial Intelligence and Law manuscript No. (will be inserted by the editor) Evaluation of Information Retrieval for E-Discovery Douglas W. Oard · Jason R. Baron · Bruce Hedin · David D. Lewis · Stephen Tomlinson


Source URL: terpconnect.umd.edu

Language: English - Date: 2010-06-27 23:59:20
339. Relevance feedback / Ranking function / Precision and recall / Search engine indexing / IR evaluation / Information science / Information retrieval / Web query classification

Million Query Track 2009 Overview Ben Carterette∗, Virgil Pavlu†, Hui Fang‡, Evangelos Kanoulas§ The Million Query Track ran for the third time in [removed]. The track is designed to serve two purposes: first, it is an


Source URL: trec.nist.gov

Language: English - Date: 2010-08-18 12:34:13
340. Technical communication / Information / Data / Technology / Data management / Knowledge representation / Metadata

2014 TRECVID MULTIMEDIA EVENT DETECTION & RECOUNTING EVALUATION PLAN This is the evaluation plan for Multimedia Event Detection and Retrieval (MED) and Mul


Source URL: www.nist.gov

Language: English - Date: 2014-05-15 14:12:28